Divergence L²-Coercivity Inequalities
Authors
Abstract
Similar resources
Coercivity conditions and variational inequalities
Various coercivity conditions appear in the literature to guarantee the existence of solutions to the variational inequality problem. We show that these conditions are equivalent to each other and that they are not only sufficient but also necessary for the set of solutions to be nonempty and bounded.
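For orientation, the problem and a typical coercivity condition of the kind this abstract refers to can be stated as follows; the notation (F, K, x_0, ρ) is generic and not taken from the paper. Given a closed convex set $K \subseteq \mathbb{R}^n$ and a map $F : K \to \mathbb{R}^n$, the variational inequality problem is
\[
\mathrm{VIP}(F,K):\quad \text{find } x^{*} \in K \text{ such that } \langle F(x^{*}),\, y - x^{*} \rangle \ge 0 \ \text{ for all } y \in K ,
\]
and a typical coercivity condition requires the existence of $x_0 \in K$ and $\rho > 0$ such that
\[
\langle F(y),\, y - x_0 \rangle > 0 \quad \text{for all } y \in K \text{ with } \|y\| > \rho .
\]
Conditions of this kind prevent solutions from escaping to infinity, which is why they connect to the boundedness of the solution set.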
Nested Inequalities Among Divergence Measures
In this paper we consider an inequality involving eleven divergence measures. Three of them are logarithmic, namely the Jeffreys-Kullback-Leibler [4], [5] J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [7] arithmetic-geometric mean divergence. Another three are non-logarithmic, namely the Hellinger discrimination, the symmetric χ²-divergence, and the triangular discrimination. Three more ...
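For reference, the usual textbook forms of the six named measures, for discrete probability distributions $P=(p_1,\dots,p_n)$ and $Q=(q_1,\dots,q_n)$, are listed below; these are standard definitions (normalization constants vary slightly between papers), not formulas quoted from this abstract.
\[
J(P\|Q)=\sum_{i=1}^{n}(p_i-q_i)\ln\frac{p_i}{q_i},\qquad
I(P\|Q)=\frac{1}{2}\sum_{i=1}^{n}\Big[p_i\ln\frac{2p_i}{p_i+q_i}+q_i\ln\frac{2q_i}{p_i+q_i}\Big],
\]
\[
T(P\|Q)=\sum_{i=1}^{n}\frac{p_i+q_i}{2}\,\ln\frac{p_i+q_i}{2\sqrt{p_iq_i}},\qquad
h(P\|Q)=\frac{1}{2}\sum_{i=1}^{n}\big(\sqrt{p_i}-\sqrt{q_i}\big)^{2},
\]
\[
\Psi(P\|Q)=\sum_{i=1}^{n}\frac{(p_i-q_i)^{2}(p_i+q_i)}{p_iq_i},\qquad
\Delta(P\|Q)=\sum_{i=1}^{n}\frac{(p_i-q_i)^{2}}{p_i+q_i},
\]
for the J-divergence, Jensen-Shannon divergence, arithmetic-geometric mean divergence, Hellinger discrimination, symmetric χ²-divergence, and triangular discrimination, respectively.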
Refinement Inequalities among Symmetric Divergence Measures
There are three classical divergence measures in the literature on information theory and statistics, namely Jeffreys-Kullback-Leibler's J-divergence, Sibson-Burbea-Rao's Jensen-Shannon divergence, and Taneja's arithmetic-geometric mean divergence. These bear an interesting relationship to one another and are based on logarithmic expressions. The divergence measures like the Hellinger discrimination...
Generalized Symmetric Divergence Measures and Inequalities
The first measure generalizes the well-known J-divergence due to Jeffreys [16] and to Kullback and Leibler [17]. The second measure gives a unified generalization of the Jensen-Shannon divergence due to Sibson [22] and to Burbea and Rao [2, 3], and of the arithmetic-geometric mean divergence due to Taneja [27]. These two measures contain in particular some well-known divergences such as the Hellinger discr...
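The defining formulas of the two one-parameter measures are truncated out of this excerpt. As an illustration of how such a type-s family behaves, one standard symmetrized divergence of type s (an assumed form, not necessarily the paper's exact definition) is
\[
J_{s}(P\|Q)=\frac{1}{s(s-1)}\Big[\sum_{i=1}^{n}\big(p_i^{s}q_i^{1-s}+p_i^{1-s}q_i^{s}\big)-2\Big],\qquad s\neq 0,1,
\]
which recovers the Jeffreys-Kullback-Leibler J-divergence in the limit $s\to 1$ (and, by the symmetry $s\leftrightarrow 1-s$, also as $s\to 0$), matching the role the abstract attributes to the first measure.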
Relative Divergence Measures and Information Inequalities
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler's [17] relative information, Jeffreys' [16] J-divergence, and the information radius or Jensen difference divergence measure due to Sibson [23]. Burbea and Rao [3, 4] have also found applications for it in the literature. Taneja [25] studied another...
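For context, the measures named here are usually written as follows (standard forms for discrete distributions P and Q, not quoted from the paper):
\[
K(P\|Q)=\sum_{i=1}^{n}p_i\ln\frac{p_i}{q_i},\qquad
J(P\|Q)=K(P\|Q)+K(Q\|P),\qquad
I(P\|Q)=\frac{1}{2}\Big[K\Big(P\,\Big\|\,\frac{P+Q}{2}\Big)+K\Big(Q\,\Big\|\,\frac{P+Q}{2}\Big)\Big],
\]
giving the Kullback-Leibler relative information, the Jeffreys J-divergence, and Sibson's information radius (Jensen difference), respectively; the relative (asymmetric) measure K is the building block from which the symmetric measures are obtained.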
Journal
Journal title: Numerical Functional Analysis and Optimization
Year: 2006
ISSN: 0163-0563, 1532-2467
DOI: 10.1080/01630560600790777